- Language models like ChatGPT only produce likely sequences of words: "If the text makes sense, it is because we, the readers, make sense of it."
- These systems cannot learn the meaning of language, only its form.
- The linguistic forms created by these systems don’t carry meaning except to those who know the linguistic system.
- Thought experiment: if you were trapped in a library where everything is written in Thai and there are no images, could you learn the language and understand the meaning of its words? No, not without outside aids such as a translated book you already know in English. Given enough time you might be able to produce a string of words a Thai speaker could understand, but you would not actually know what it means, just as a language model does not actually know the meaning of what it produces.
- Fluency in language does not equate to intelligence.
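The "likely sequences of words" point can be made concrete with a toy bigram model. This is a minimal stdlib sketch, not how ChatGPT works (which uses a neural network over subword tokens rather than word-pair counts), but it shows the same statistical principle: each next word is sampled according to how often it followed the previous word in the training text, with no reference to meaning. The corpus here is invented for illustration.

```python
import random
from collections import defaultdict

# Tiny invented corpus; a real model is trained on billions of words,
# but the principle is the same: count which words follow which.
corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Build bigram counts: for each word, collect every word that follows it.
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

def generate(start, length, seed=0):
    """Emit a likely-looking sequence by repeatedly sampling a next word."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        options = following.get(out[-1])
        if not options:
            break  # dead end: this word never had a successor in the corpus
        out.append(rng.choice(options))  # sample proportionally to counts
    return " ".join(out)

print(generate("the", 6))
```

The output is locally fluent English precisely because the statistics of the corpus are fluent English; the program has no access to what "cat" or "mat" refer to.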
Old Index
I got a bit out of step in adding your notes under appropriate weekly headings somewhere around the last quarter of the term...
Week 1
Week 2
Language models can only write poetry - Allison Parish (TO)
[[note on
About These Notes
Your Notes
I used a topic model to index your notes...
Topic 1: language model models word words text chatgpt make based human probability meaning produce poetry work probabilities experiment understanding understand neural
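The "Topic 1" line above is the standard output of a topic model: the highest-weight words for one discovered topic. A real index would use an algorithm like LDA (for example scikit-learn's LatentDirichletAllocation). As a hedged stand-in, here is a stdlib-only sketch that simply extracts the most frequent non-stopword tokens from a set of notes — not a true topic model, but it shows where a word list like the one above comes from. The note texts and stopword list are invented for illustration.

```python
from collections import Counter

# Hypothetical note texts standing in for the class notes being indexed.
notes = [
    "language models produce likely sequences of words",
    "a language model learns the form of language, not meaning",
    "fluency in language does not equate to intelligence",
]

# Small invented stopword list; real pipelines use larger curated lists.
STOPWORDS = {"a", "the", "of", "in", "not", "does"}

def top_words(texts, n=5):
    """Return the n most frequent non-stopword tokens across the texts."""
    counts = Counter(
        word for text in texts for word in text.lower().split()
        if word not in STOPWORDS
    )
    return [word for word, _ in counts.most_common(n)]

print("Topic:", " ".join(top_words(notes)))
```

Frequency counting over one pile of notes gives one word list; LDA differs in that it assigns words to several latent topics at once, which is why the real index can label the list "Topic 1".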
- [[Fr